Independent components of natural images under variable compression rate

Author

  • Akio Utsugi
Abstract

A generalized ICA model allowing overcomplete bases and additive noise in the observables is applied to natural image data. It is well known that such a model produces independent components resembling simple cells in primary visual cortex, or Gabor functions. We adopt a variable-sparsity density on each independent component, given by a mixture of a delta function and a standard Gaussian density. In the experiments, we observe that the aspect ratios of the optimal bases increase with the noise level and the degree of sparsity. The meaning of this phenomenon is discussed.
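The variable-sparsity density above, a mixture of a point mass at zero and a standard Gaussian, can be sketched as follows. This is a minimal illustration; the mixing weight `alpha` and the sample size are assumptions for demonstration, not values from the paper:

```python
import numpy as np

def sample_sparse_components(n, alpha, rng):
    """Draw n samples from the mixture prior
    p(s) = (1 - alpha) * delta(s) + alpha * N(s; 0, 1):
    each component is exactly zero with probability 1 - alpha
    and standard-Gaussian otherwise."""
    active = rng.random(n) < alpha            # which components are "on"
    s = np.zeros(n)
    s[active] = rng.standard_normal(active.sum())
    return s

rng = np.random.default_rng(0)
s = sample_sparse_components(10_000, alpha=0.3, rng=rng)
print(np.mean(s != 0))  # close to alpha: the degree of sparsity
```

Smaller `alpha` forces more components to be exactly zero, which is the "degree of sparsity" the abstract refers to.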


Related articles

A Method for Image Preprocessing to Improve JPEG Performance

A great deal of research has been carried out on image compression, and different methods have been proposed. Each existing method achieves different compression rates on different images. By identifying the effective parameters of a compression algorithm and strengthening them in a preprocessing stage, the compression rate of the algorithm can be improved. JPEG is one of the successful compression...


Image compression using orthogonalized independent components bases

In this paper we address the orthogonalization of independent component analysis (ICA) to obtain transform-based image coders. We consider several classes of training images, from which we extract the independent components, followed by orthogonalization, obtaining bases for image coding. Experimental tests show the generalization ability of ICA of natural images, and the adaptation ability to ...
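The orthogonalization step this abstract mentions can be sketched with symmetric orthogonalization, W(WᵀW)^{-1/2}, a common choice for ICA bases; the abstract does not state which scheme the authors actually use, so treat this as an illustrative assumption:

```python
import numpy as np

def symmetric_orthogonalize(W):
    """Return the closest orthogonal matrix to W in Frobenius norm,
    W (W^T W)^{-1/2}, computed via the SVD: U S V^T -> U V^T."""
    U, _, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(1)
W = rng.standard_normal((8, 8))         # e.g. a learned 8x8 ICA basis
Q = symmetric_orthogonalize(W)
print(np.allclose(Q.T @ Q, np.eye(8)))  # True: columns are orthonormal
```

An orthogonal basis allows simple transform coding, since the inverse transform is just the transpose.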


Compression of Breast Cancer Images By Principal Component Analysis

The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of eigenvectors ei ∈ RN of its covariance matrix. The eigenvectors oriented in the direction of maximum variance of X in RN carry the most relevant information of X. These eigenvectors are called principal components [8]. Ass...
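The PCA principle described here can be sketched in a few lines. This is illustrative only; the data shape and the number of retained components are assumptions, not details from the paper:

```python
import numpy as np

def pca_compress(X, k):
    """Project the rows of X onto the top-k eigenvectors of the
    covariance matrix of X (the principal components)."""
    Xc = X - X.mean(axis=0)                  # center the data
    cov = np.cov(Xc, rowvar=False)           # N x N covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :k]            # top-k directions by variance
    return Xc @ top, top                     # compressed codes and basis

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 10))           # 100 samples, 10 dimensions
codes, basis = pca_compress(X, k=3)
print(codes.shape)  # (100, 3): each sample compressed to 3 numbers
```

Compression comes from keeping only the k highest-variance directions and discarding the rest.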



Algorithms for Independent Components Analysis and Higher Order Statistics

A latent variable generative model with finite noise is used to describe several different algorithms for Independent Components Analysis (ICA). In particular, the Fixed Point ICA algorithm is shown to be equivalent to the Expectation-Maximization algorithm for maximum likelihood under certain constraints, allowing the conditions for global convergence to be elucidated. The algorithms can also b...



Journal:
  • Neurocomputing

Volume 49, Issue

Pages -

Publication date: 2002